Efficient Pruning Method for Ensemble Self-Generating Neural Networks

Authors

  • Hirotaka INOUE
  • Hiroyuki NARIHISA
Abstract

Recently, multiple classifier systems (MCS) have been used in practical applications to improve classification accuracy. Self-generating neural networks (SGNN) are well suited as base classifiers for MCS because of their simple setup and fast learning. However, the computation cost of an MCS increases in proportion to the number of SGNNs. In this paper, we propose an efficient pruning method for the structure of the SGNN in the MCS. We compare the pruned MCS with two sampling methods. Experiments were conducted to compare the pruned MCS with an unpruned MCS, an MCS based on C4.5, and the k-nearest neighbor method. The results show that the pruned MCS can improve classification accuracy while reducing the computation cost.
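The abstract's core idea, removing ensemble members to cut computation cost without hurting accuracy, can be illustrated with a generic sketch. This is a minimal illustration, not the authors' method: the base learner below is a toy bootstrap 1-nearest-neighbour stand-in for an SGNN, and the greedy validation-accuracy pruning criterion is an assumption for demonstration purposes.

```python
import random

# Toy 1-nearest-neighbour base classifier trained on a bootstrap sample
# (a stand-in for an SGNN; the paper's actual base learner is more elaborate).
class OneNN:
    def fit(self, X, y):
        idx = [random.randrange(len(X)) for _ in range(len(X))]  # bootstrap resample
        self.X = [X[i] for i in idx]
        self.y = [y[i] for i in idx]
        return self

    def predict(self, x):
        # label of the nearest stored training point (1-D inputs)
        return min(zip(self.X, self.y), key=lambda p: abs(p[0] - x))[1]

def vote(ensemble, x):
    preds = [m.predict(x) for m in ensemble]
    return max(set(preds), key=preds.count)  # majority vote

def accuracy(ensemble, X, y):
    return sum(vote(ensemble, x) == t for x, t in zip(X, y)) / len(y)

def prune(ensemble, X_val, y_val):
    """Greedily drop members whose removal does not lower validation accuracy."""
    kept = list(ensemble)
    best = accuracy(kept, X_val, y_val)
    for m in list(kept):
        if len(kept) == 1:
            break
        trial = [c for c in kept if c is not m]
        acc = accuracy(trial, X_val, y_val)
        if acc >= best:
            kept, best = trial, acc  # smaller ensemble, no accuracy loss
    return kept

random.seed(0)
X = list(range(20))
y = [int(x >= 10) for x in X]              # simple threshold labels
ensemble = [OneNN().fit(X, y) for _ in range(7)]
pruned = prune(ensemble, X, y)
print(len(pruned), accuracy(pruned, X, y))
```

Because the greedy loop only accepts a removal when validation accuracy does not drop, the pruned ensemble is never worse on the validation set while prediction cost shrinks with each accepted removal, which mirrors the trade-off the abstract describes.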


Similar articles

Experiments with an Ensemble Self-Generating Neural Network

In an earlier paper, we introduced an ensemble model called ESGNN (ensemble self-generating neural network), which can be used to reduce the error for classification and chaotic time series prediction. Although this model obtains higher accuracy than a single SGNN, the computational cost increases in proportion to the number of SGNNs in an ensemble. In this paper, we propose a new pruning SGN...


Effect of Pruning and Early Stopping on Performance of a Boosting Ensemble

Generating an architecture for an ensemble of boosting machines involves making a series of design decisions. One design decision is whether to use simple “weak learners” such as decision tree stumps or more complicated weak learners such as large decision trees or neural networks. Another design decision is the training algorithm for the constituent weak learners. Here we concentrate on binary...


Self-generating Neural Networks

This review of recent advances concerning growing and pruning neural networks focuses on three areas. Adaptive resonance theory networks automatically have an architecture whose size adjusts to the task. Networks that optimize resource allocation have been around for nearly fifteen years, and developments are still being made in growing and pruning strategies, particularly for on-line, real-time a...


A competitive ensemble pruning approach based on cross-validation technique

Ensemble pruning is crucial for both the efficiency and the predictive accuracy of an ensemble system. This paper proposes a new Competitive measure for Ensemble Pruning based on Cross-Validation technique (CEPCV). Firstly, the data to be learnt by neural computing models mostly drift with time and environment, while the proposed CEPCV method can realize on-line ensemble pr...


An Empirical Comparison of Pruning Methods for Ensemble Classifiers

Many researchers have shown that ensemble methods such as Boosting and Bagging improve the accuracy of classification. Boosting and Bagging perform well with unstable learning algorithms such as neural networks or decision trees. Pruning decision tree classifiers is intended to make trees simpler and more comprehensible and to avoid over-fitting. However, it is known that pruning individual classif...



Journal:

Volume   Issue 

Pages  -

Publication year: 2003